whose asymptotic limit ($n \to \infty$) Shannon calls “entropy of the source”, is a measure of the information in the $(n+1)$th symbol, assuming the $n$ previous ones are known. The decay of $\tilde{I}_n$ quantifies correlations within the symbolic sequence (an aspect of memory).
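On this reading, $\tilde{I}_n$ is the conditional entropy of the next symbol given the $n$ preceding ones. As a minimal sketch (not from the text; the sequence and function name are illustrative), it can be estimated from empirical block frequencies in Python:

```python
from collections import Counter
from math import log2

def conditional_entropy(seq, n):
    """Entropy (in bits) of the next symbol given the n preceding ones,
    estimated from the empirical block frequencies of seq."""
    starts = range(len(seq) - n)              # positions whose n-block has a successor
    context = Counter(tuple(seq[i:i + n]) for i in starts)
    extended = Counter(tuple(seq[i:i + n + 1]) for i in starts)
    total = sum(extended.values())
    h = 0.0
    for block, count in extended.items():
        p_joint = count / total               # p(context, next symbol)
        p_cond = count / context[block[:n]]   # p(next symbol | context)
        h -= p_joint * log2(p_cond)
    return h

seq = "ababababababcabababab"
for n in range(4):
    print(n, round(conditional_entropy(seq, n), 3))  # decays as longer contexts are known
```

For a perfectly periodic sequence the estimate falls to zero once $n$ spans the period; residual correlations (memory) show up as a slower decay.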

6.1 Structure and Quantity

In our discussion so far we have tacitly assumed that we know a priori the set from which the actual measurement will come. In an actual physical experiment, this is like knowing from which dial we shall take readings of the position of the pointer, for example; furthermore, this knowledge may comprise all the information required to construct and use the meter, which is far more than that needed to formally specify the circuit diagram and other details of the construction. It would also have to include blueprints for the machinery needed to make the mechanical and electronic components, for manufacturing the required materials from available matter, and so forth. In many cases we do not need to concern ourselves with all this, because we are only interested in the gain in information (i.e., the loss of uncertainty) obtained by receiving the result of the dial reading, which is given by Eq. (6.5). The information pertinent to the construction of the experiment usually remains the same, and hence cancels out (Eq. 6.7). In other words, the Shannon–Weaver index is strictly concerned with the metrical aspects of information, not with its structure.
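Eq. (6.5) is not reproduced in this excerpt. Assuming the usual reading, that the gain in information is the Shannon entropy of the outcome set before the reading minus that after it, a minimal Python sketch is:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon-Weaver index H = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before the reading: the pointer may sit on any of 8 dial positions,
# all equally likely -> 3 bits of uncertainty.
before = [1 / 8] * 8
# After the reading: the outcome is known with certainty -> 0 bits.
after = [1.0]

gain = shannon_entropy(before) - shannon_entropy(after)
print(gain)  # 3.0 bits gained (uncertainty lost) by reading the dial
```

The information needed to build the meter enters both terms identically, which is why it cancels out of the difference.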

6.1.1 The Generation of Information

Prior to carrying out an experiment or an observation, there is objective uncertainty, because several possibilities (for the result) have to be taken into account. The information furnished by the outcome of the experiment reduces this uncertainty. R.A. Fisher defined the quantity of information furnished by a series of repeated measurements as the reciprocal of the variance:

$$I_{\mathrm{F}}(x) \le \frac{1}{\langle (x_{\mathrm{est}} - x)^2 \rangle} \qquad (6.12)$$

where $I_{\mathrm{F}}$ is the Fisher information and the denominator of the right-hand side is the variance of the estimator $x_{\mathrm{est}}$.⁶ One use of $I_{\mathrm{F}}$ is to measure the encoding accuracy of a population of neurons subject to some stimulus (Chap. 24); maximizing $I_{\mathrm{F}}$ optimizes extraction of the value of the stimulus.⁷
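As an illustrative numerical check of Eq. (6.12) (a sketch under assumptions not in the text: Gaussian readout noise, with the sample mean as the estimator $x_{\mathrm{est}}$), the reciprocal of the empirical variance of the estimator can be computed directly:

```python
import random
import statistics

def fisher_information_estimate(true_x, noise_sd, n_repeats, n_trials=2000):
    """Estimate I_F(x) as the reciprocal of the empirical variance of the
    estimator x_est, taken here as the mean of n_repeats noisy readings."""
    estimates = []
    for _ in range(n_trials):
        readings = [random.gauss(true_x, noise_sd) for _ in range(n_repeats)]
        estimates.append(statistics.fmean(readings))
    variance = statistics.fmean((e - true_x) ** 2 for e in estimates)
    return 1.0 / variance

# For Gaussian noise the variance of the mean is noise_sd**2 / n_repeats,
# so the estimate should be close to 10 / 0.2**2 = 250 here.
print(fisher_information_estimate(true_x=1.0, noise_sd=0.2, n_repeats=10))
```

Repeating the measurement thus increases the information furnished, since the variance of the estimator shrinks in proportion to the number of readings.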

⁶ The relation between the Shannon index and Fisher’s information, which refers to the intrinsic accuracy of an experimental result, is treated by Kullback and Leibler (1951).

⁷ An example is given by Karbowski (2000).